Ab Initio, Big Data, Informatica, Tableau, Data Architect, Cognos, MicroStrategy, Healthcare Business Analysts, Cloud, etc.
at Exusia



We are looking for computer science/engineering final-year students and fresh graduates who have a solid understanding of computer science fundamentals (algorithms, data structures, object-oriented programming) and strong Java programming skills. You will get to work on machine learning algorithms as applied to online advertising, or do data analytics. You will learn how to collaborate in small, agile teams, do rapid development and testing, and get a taste of the invigorating feel of a start-up company.
Experience
None required
Required Skills
-Solid foundation in computer science, with strong competencies in data structures, algorithms, and software design
-Java / Python programming
-UI/UX: HTML5, CSS3, JavaScript
-MySQL, relational databases
-MVC Framework, ReactJS
Optional Skills
-Familiarity with online advertising, web technologies
-Familiarity with Hadoop, Spark, Scala
Education
UG - B.Tech/B.E. - Computers; PG - M.Tech - Computers
Job Summary
Condé Nast is looking for a talented Software Quality Assurance Manager. In this role you will
be part of an IT organization, providing direction and leadership on all quality matters
across the platform or domain. This is an opportunity to act as a key contributor in ensuring
that work packages transition seamlessly from development into live environments, while honing your
knowledge of quality best practices and your collaboration and stakeholder-management skills.
Objectives of the Role
● Lead a global team of QE engineers who are responsible for ensuring E2E (end-to-end) quality
of our products
● Provide high-level direction for test teams, delivering a clear and consistent vision
● Develop a roadmap for building needed test expertise. Establish and evolve formal QA
processes
● Define quality and automation KPI goals for the team and drive success towards achieving
them.
● Recommend, implement, and monitor preventative and corrective actions to ensure that
quality assurance standards are achieved
● Participate in brainstorming sessions and cross-departmental meetings to ensure collaboration
and cohesion
● Oversee all aspects of quality assurance including establishing metrics, implementing best
practices, and developing new tools and processes to ensure quality goals are met
● Act as a key point of contact for all QA aspects of releases, providing QA services and
coordinating QA resources internally and externally
● Review test strategies and test plans, and ensure that functional, performance, and
scalability aspects are covered and tested
● Work cross-functionally with Product and Engineering to ensure smooth delivery of products
that are on time, feature-rich, and of high quality
● Present QA status of testing with state-of-the-art dashboards and highlight key blockers and
where help is needed
● Ensure automated scripts are produced for E2E testing scenarios, including performance and
scale testing
● Review user found defects and continue to enhance and improve testing methodology and fill
gaps in testing as needed
● Communicate product status, key issues, and insights to key constituents across the
organization, including the executive team.
● Develop process for manual and automated testing strategy for business applications, web
applications and BI solutions.
● Establish a testing strategy for handling quarterly releases of business applications
Required Skills
● 12+ years' experience in a product quality testing role, including 2-3 years in a management position
● Seasoned QA Manager with a track record of delivering high-quality products in a fast-paced
environment
● Must have prior experience building QA / data validation frameworks in big data for various
business units (BUs)
● Experience in "closing the loop" and "continuous improvement" of QA strategies based on
learnings from field issues
● Experience in business applications, web applications, and Business Intelligence applications
● Hands-on QA experience including testing automation. Deep knowledge of automation best
practices and industry trends.
● Good to have: knowledge of one or more test automation tools such as xUnit, Selenium, JMeter,
Ranorex, etc.
● Experience in automation testing using Selenium WebDriver and testing frameworks such as TestNG
and NUnit
● Strong proficiency in SQL coding (T-SQL or PL-SQL)
● Familiarity with data warehousing practices, ETL processing and denormalized data structures.
● Background in programming and testing across platforms is a plus
● Proficiency with big data concepts and/or tools is a plus (Hadoop, Hive, Spark)
● Strong analytical and reasoning skills
● Strong verbal, written communication skills and strong interpersonal skills.
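As an illustration of the xUnit-style test automation the tools above belong to (xUnit, TestNG, and NUnit all follow the same pattern), here is a minimal, hypothetical sketch using Python's built-in unittest; the `normalize_price` function and its data are invented for the example and do not reflect any Condé Nast system.

```python
import unittest

def normalize_price(raw: str) -> float:
    """Strip currency symbols, commas, and whitespace from a price string."""
    return float(raw.strip().lstrip("$").replace(",", ""))

class NormalizePriceTest(unittest.TestCase):
    """xUnit-style suite: one focused, assertion-driven method per behavior."""

    def test_plain_number(self):
        self.assertEqual(normalize_price("19.99"), 19.99)

    def test_currency_symbol_and_commas(self):
        self.assertEqual(normalize_price(" $1,234.50 "), 1234.5)

# Run the suite programmatically, as a CI pipeline might.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(NormalizePriceTest)
outcome = unittest.TextTestRunner(verbosity=0).run(suite)
```

Tools like Selenium WebDriver plug into this same structure, with browser interactions inside the test methods instead of pure function calls.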
About Condé Nast
CONDÉ NAST INDIA (DATA)
Over the years, Condé Nast has successfully expanded and diversified into digital, TV, and social
platforms, generating a staggering amount of user data in the process. Condé Nast made the right move
to invest heavily in understanding this data and formed a whole new Data team entirely
dedicated to data processing, engineering, analytics, and visualization. This team helps drive
engagement, fuel process innovation, further content enrichment, and increase market
revenue. The Data team aims to create a company culture where data is the common
language and to foster an environment where insights shared in real time can improve
performance.
The Global Data team operates out of Los Angeles, New York, Chennai, and London. The team
at Condé Nast Chennai works extensively with data to amplify its brands' digital capabilities and
boost online revenue. We are broadly divided into four groups: Data Intelligence, Data
Engineering, Data Science, and Operations (including Product and Marketing Ops and Client
Services), along with Data Strategy and Monetization. These teams build capabilities and products
to create data-driven solutions for better audience engagement.
What we look forward to:
We want to welcome bright, new minds into our midst and work together to create diverse
forms of self-expression. At Condé Nast, we encourage the imaginative and celebrate the
extraordinary. We are a media company for the future, with a remarkable past. We are Condé
Nast, and It Starts Here.
- Produce clean code and automated tests
- Align with enterprise architecture frameworks and standards
- Be the role-model for all engineers in the team in terms of technical competency
- Research, assess and adopt new technologies as required
- Be a guide and mentor to the team members and help in ramping up the overall skill-base of the team.
- Produce detailed estimates and optimized work plans for requirements and changes
- Ensure that features are delivered on time and that they meet the business needs
- Strive for quality of performance, usability, reliability, maintainability, and extensibility
- Identify opportunities for process and tool improvements
- Use analytical rigor to produce effective solutions to poorly defined problems
- Follow a "build to ship" mantra in practice with full DevOps implementation
- 10+ years of core software development and product creation experience in CPaaS.
- Working knowledge of VoIP, communication APIs, J2EE, JMS/Kafka, Web Services, Hadoop, React, Node.js, GoLang
- Working knowledge of various CPaaS channels: SMS, voice, WhatsApp, RCS, email
- Working knowledge of DevOps, automation testing, test-driven development, behavior-driven development, serverless, or microservices
- Experience with AWS / Azure deployments
- Solid background in large scale software development.
- Full stack understanding of web/mobile/API/database development concepts and patterns
- Exposure to microservices, IaaS, PaaS, service mesh, SaaS, and cloud-native application development
- Understanding of Agile Scrum and SDLC principles.
- Containerization and orchestration: Docker, Kubernetes, OpenShift, Consul, etc.
- Knowledge of NFV (OpenStack, vSphere, vCloud, etc.)
- Experience in Data Analytics/AI/ML or Marketing Tech domain is an added advantage
● Translate complex business requirements into scalable technical solutions meeting data design
standards. Strong understanding of analytics needs and the proactiveness to build generic solutions
that improve efficiency
● Build dashboards using Self-Service tools on Kibana and perform data analysis to support
business verticals
● Collaborate and work with multiple cross-functional teams

We are looking for a Senior Python Developer to produce large scale distributed software solutions. You’ll be part of a cross-functional team that’s responsible for the complete software development life cycle, from conception to deployment.
If you’re also familiar with Agile methodologies, we’d like to meet you.
Responsibilities:
- Work with development teams and product managers to ideate software solutions
- Design client-side and server-side architecture
- Build the front-end of applications through appealing visual design
- Develop and manage well-functioning databases and applications
- Write effective APIs
- Test software to ensure responsiveness and efficiency
- Troubleshoot, debug, and upgrade software
- Create security and data protection settings
- Write technical documentation
Requirements
- Proven experience as a Python Developer or similar role
- Knowledge of Python, Django, MongoDB, Elasticsearch, AWS
- Excellent communication and teamwork skills
- Great attention to detail
- Organizational skills
- An analytical mind
- Experience with Apache Kafka, HBase, and graph databases is an added bonus

Bachelor's degree or equivalent experience
● Knowledge of database fundamentals and fluency in advanced SQL, including concepts
such as windowing functions
● Knowledge of popular scripting languages for data processing such as Python, as well as
familiarity with common frameworks such as Pandas
● Experience building streaming ETL pipelines with tools such as Apache Flink, Apache
Beam, Google Cloud Dataflow, DBT and equivalents
● Experience building batch ETL pipelines with tools such as Apache Airflow, Spark, DBT, or
custom scripts
● Experience working with messaging systems such as Apache Kafka (and hosted
equivalents such as Amazon MSK), Apache Pulsar
● Familiarity with BI applications such as Tableau, Looker, or Superset
● Hands on coding experience in Java or Scala
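To make the "windowing functions" requirement above concrete, here is a minimal sketch using Python's built-in sqlite3 module (this assumes SQLite 3.25+, which added window-function support); the sales table is made-up data for illustration only.

```python
import sqlite3

# In-memory table of daily sales; the data is invented for the example.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (day INTEGER, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [(1, 10.0), (2, 20.0), (3, 30.0)])

# A window function: a running total over days, computed without
# collapsing rows the way GROUP BY would.
rows = conn.execute("""
    SELECT day,
           amount,
           SUM(amount) OVER (ORDER BY day) AS running_total
    FROM sales
    ORDER BY day
""").fetchall()

for day, amount, total in rows:
    print(day, amount, total)
```

The key property is that every input row survives in the output, each annotated with an aggregate computed over its ordered window.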
- Looking only for candidates with an immediate to 15-day notice period.
- Looking for an experienced Integration specialist with a good expertise in ETL Informatica and a strong Application integration background
- Minimum of 3+ years of relevant experience in Informatica MDM required. PowerCenter is a core skill set.
- Having experience in a broader Informatica toolset is strongly preferred
- Should demonstrate very strong implementation experience in application integration, with the ability to present expertise across multiple use cases
- Passionate coders with a strong Application development background, years of experience could range from 5+ to 15+
- Should have application development experience outside of ETL (just learning ETL as a tool is not enough); experience writing applications outside of ETL will bring in more value
- Strong database skills with a strong understanding of data, data quality, and data governance, including developing standalone and integrated database layers (SQL, packages, functions, performance tuning); i.e., an expert with a strong integration background who has more application integration experience than just the ETL Informatica tool
- Experience in XML/JSON-based integration, heavily involving JMS MQ (read/write)
- Experience with SOAP- and REST-based APIs exchanging both XML and JSON payloads for request and response
- Experience with Salesforce.com integration using the Informatica PowerExchange module is a plus but not required
- Experience with Informatica MDM as the technology stack used for integrating senior market members with Salesforce.com is a plus but not required
- Very strong scripting background (C/bourne shell/Perl/Java)
- Should be able to understand Java; we do have development around Java, i.e., the ability to work out a solution in a programming language like Java when implementation is not possible through ETL
- Ability to communicate effectively via multiple channels (verbal, written, etc.) with technical and non-technical staff.
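As a small illustration of the XML/JSON request-response exchange described above, here is a sketch using only Python's standard library; the payload fields (`memberId`, `status`) and the service shape are hypothetical, and this is not any Informatica or Salesforce API.

```python
import json
import xml.etree.ElementTree as ET

# Hypothetical JSON request payload such as an integration layer might send.
request_body = json.dumps({"memberId": "M123", "action": "sync"})

# Hypothetical XML response such as a SOAP-style service might return.
response_xml = """<response>
    <status>OK</status>
    <memberId>M123</memberId>
</response>"""

# Parse the XML response and cross-check it against the JSON request.
root = ET.fromstring(response_xml)
status = root.findtext("status")
member_id = root.findtext("memberId")

assert json.loads(request_body)["memberId"] == member_id
print(status, member_id)
```

In a real integration the JSON would travel over a REST call and the XML over a SOAP endpoint or JMS MQ, but the marshal/unmarshal step looks the same.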

Your mission is to help lead the team toward creating solutions that improve the way our business is run. Your knowledge of design, development, coding, testing, and application programming will help your team raise their game, meeting your standards as well as satisfying both business and functional requirements. Your expertise in various technology domains will be counted on to set strategic direction and solve complex and mission-critical problems, internally and externally. Your quest to embrace leading-edge technologies and methodologies inspires your team to follow suit.
Responsibilities and Duties :
- As a Data Engineer you will be responsible for developing data pipelines for numerous applications, handling all kinds of data: structured, semi-structured, and unstructured. Big data knowledge, especially in Spark and Hive, is highly preferred.
- Work in a team and provide proactive technical oversight; advise development teams, fostering re-use, design for scale, stability, and operational efficiency of data/analytical solutions
Education level :
- Bachelor's degree in Computer Science or equivalent
Experience :
- Minimum 3+ years of relevant experience working on production-grade projects, with hands-on, end-to-end software development experience
- Expertise in application, data and infrastructure architecture disciplines
- Expertise in designing data integrations using ETL and other data integration patterns
- Advanced knowledge of architecture, design and business processes
Proficiency in :
- Modern programming languages like Java, Python, Scala
- Big Data technologies Hadoop, Spark, HIVE, Kafka
- Writing decently optimized SQL queries
- Orchestration and deployment tools like Airflow & Jenkins for CI/CD (Optional)
- Responsible for design and development of integration solutions with Hadoop/HDFS, Real-Time Systems, Data Warehouses, and Analytics solutions
- Knowledge of system development lifecycle methodologies, such as waterfall and Agile.
- An understanding of data architecture and modeling practices and concepts, including entity-relationship diagrams, normalization, abstraction, denormalization, dimensional modeling, and metadata modeling practices.
- Experience generating physical data models and the associated DDL from logical data models.
- Experience developing data models for operational, transactional, and operational reporting, including the development of or interfacing with data analysis, data mapping,
and data rationalization artifacts.
- Experience enforcing data modeling standards and procedures.
- Knowledge of web technologies, application programming languages, OLTP/OLAP technologies, data strategy disciplines, relational databases, data warehouse development and Big Data solutions.
- Ability to work collaboratively in teams and develop meaningful relationships to achieve common goals
Skills :
Must Know :
- Core big-data concepts
- Spark - PySpark/Scala
- A data integration tool like Pentaho, NiFi, SSIS, etc. (at least one)
- Handling of various file formats
- Cloud platform - AWS/Azure/GCP
- Orchestration tool - Airflow
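As a sketch of the "handling of various file formats" item above, here is a minimal Python example that normalizes CSV and JSON-lines input into one record shape; the data and the `read_records` helper are invented for illustration, and a real pipeline would do this with Spark or a tool like NiFi.

```python
import csv
import io
import json

# The same kind of records arriving in two different formats (made-up data).
csv_text = "id,score\n1,0.9\n2,0.7\n"
jsonl_text = '{"id": 3, "score": 0.5}\n'

def read_records(text: str, fmt: str) -> list:
    """Normalize different file formats into one list-of-dicts shape."""
    if fmt == "csv":
        return [{"id": int(r["id"]), "score": float(r["score"])}
                for r in csv.DictReader(io.StringIO(text))]
    if fmt == "jsonl":
        return [json.loads(line) for line in text.splitlines() if line]
    raise ValueError(f"unsupported format: {fmt}")

records = read_records(csv_text, "csv") + read_records(jsonl_text, "jsonl")
print(records)
```

The point is the single normalized shape downstream code consumes, regardless of how the bytes arrived.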

Dear Candidate,
Greetings of the day!
As discussed, please find the job description below.
Job Title : Hadoop developer
Experience : 3+ years
Job Location : New Delhi
Job type : Permanent
Knowledge and Skills Required:
Brief Skills:
Hadoop, Spark, Scala and Spark SQL
Main Skills:
- Strong experience in Hadoop development
- Experience in Spark
- Experience in Scala
- Experience in Spark SQL
Why OTSI!
Working with OTSi gives you the assurance of a successful, fast-paced career.
Exposure to infinite opportunities to learn and grow, familiarization with cutting-edge technologies, cross-domain experience and a harmonious environment are some of the prime attractions for a career-driven workforce.
Join us today, as we assure you 2,000+ friends and a great career; happiness begins at a great workplace!
Feel free to refer this opportunity to your friends and associates.
About OTSI (CMMI Level 3): Founded in 1999 and headquartered in Overland Park, Kansas, OTSI offers global reach and local delivery to companies of all sizes, from start-ups to Fortune 500s. Through offices across the US and around the world, we provide universal access to exceptional talent and innovative solutions in a variety of delivery models to reduce overall risk while optimizing outcomes and enabling our customers to thrive in a global economy (http://otsi-usa.com/?page_id=2806).
OTSI's global presence, scalable and sustainable world-class infrastructure, business continuity processes, and ISO 9001:2000 and CMMI Level 3 certifications make us a preferred service provider for our clients. OTSI's expertise across technologies is enhanced by our partnerships and alliances (http://otsi-usa.com/?page_id=2933) with industry giants such as HP, Microsoft, IBM, Oracle, and SAP. Object Technology Solutions India Pvt Ltd is a leading global Information Technology (IT) services and solutions company offering a wide array of solutions for a range of key verticals. The company is headquartered in Overland Park, Kansas, and has a strong presence in the US, Europe, and Asia-Pacific with a Global Delivery Center based in India. OTSI offers a broad range of IT application solutions and services, including e-business solutions, Enterprise Resource Planning (ERP) implementation and post-implementation support, application development, application maintenance, and software customization services.
OTSI Partners & Practices
- SAP Partner
- Microsoft Silver Partner
- Oracle Gold Partner
- Microsoft CoE
- DevOps Consulting
- Cloud
- Mobile & IoT
- Digital Transformation
- Big data & Analytics
- Testing Solutions
OTSI Honor’s & Awards:
- #91 in the Inc. 5000
- Among the fastest-growing IT companies in the Inc. 5000



